Variance Scaling for EDAs Revisited

Authors

  • Oliver Kramer
  • Fabian Gieseke
Abstract

Estimation of distribution algorithms (EDAs) are derivative-free optimization approaches based on successively estimating the probability density function of the best solutions and then sampling from it. It turns out that the success of EDAs in numerical optimization strongly depends on the scaling of the variance. The contribution of this paper is a comparison of various adaptive and self-adaptive variance scaling techniques for a Gaussian EDA. The analysis includes: (1) the Gaussian EDA without scaling, but with different selection pressures and population sizes, (2) the variance adaptation technique known as Silverman's rule-of-thumb, (3) σ-self-adaptation as known from evolution strategies, and (4) transformation of the solution space by estimation of the Hessian. We discuss the results for the sphere function and its constrained counterpart.
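To make the setting concrete, the following Python snippet is a minimal sketch (my own illustration, not the authors' implementation; parameter choices such as pop_size=100 and mu=25 are assumptions) of a Gaussian EDA with maximum-likelihood estimates and truncation selection on the sphere function. With silverman=True, offspring are drawn from a Gaussian kernel density estimate over the selected parents whose bandwidth follows Silverman's rule of thumb, as one illustrative reading of variance scaling via that rule.

    import numpy as np

    def sphere(x):
        """Sphere function f(x) = sum_i x_i^2, minimum 0 at the origin."""
        return float(np.sum(x ** 2))

    def gaussian_eda(f, dim=10, pop_size=100, mu=25, generations=300,
                     silverman=False, seed=0):
        """Plain Gaussian EDA: truncation selection plus maximum-likelihood
        mean/covariance estimates.  With silverman=True, offspring are sampled
        from a Gaussian KDE over the parents with Silverman's rule-of-thumb
        bandwidth (illustrative assumption, not the paper's exact scheme)."""
        rng = np.random.default_rng(seed)
        mean = rng.uniform(-5.0, 5.0, dim)
        cov = np.eye(dim)
        parents = rng.multivariate_normal(mean, cov, mu)
        for _ in range(generations):
            if silverman:
                # Silverman's rule of thumb for d-dimensional Gaussian kernels:
                # h = (4 / (d + 2))^(1/(d+4)) * mu^(-1/(d+4))
                h = (4.0 / (dim + 2)) ** (1.0 / (dim + 4)) * mu ** (-1.0 / (dim + 4))
                # KDE sampling: pick a parent, perturb it with N(0, h^2 * cov).
                idx = rng.integers(0, mu, pop_size)
                offspring = parents[idx] + rng.multivariate_normal(
                    np.zeros(dim), h ** 2 * cov, pop_size)
            else:
                offspring = rng.multivariate_normal(mean, cov, pop_size)
            fitness = np.array([f(x) for x in offspring])
            parents = offspring[np.argsort(fitness)[:mu]]   # truncation selection
            mean = parents.mean(axis=0)                     # ML estimates
            cov = np.cov(parents, rowvar=False) + 1e-12 * np.eye(dim)
        return mean, f(mean)

    if __name__ == "__main__":
        for flag in (False, True):
            m, fm = gaussian_eda(sphere, silverman=flag)
            print(f"silverman={flag}: best f = {fm:.3e}")

Comparing the two runs gives a rough feel for why variance scaling matters: the unscaled maximum-likelihood variant tends to lose variance and stall, which is the effect the paper's comparison revolves around.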

Related articles

Single parameter scaling in one-dimensional localization revisited

The variance of the Lyapunov exponent is calculated exactly in the one-dimensional Anderson model with random site energies distributed according to the Cauchy distribution. We find a new significant scaling parameter in the system, and derive an exact analytical criterion for single parameter scaling which differs from the commonly used condition of phase randomization. The results obtained ar...

Enhancing the Performance of Maximum-Likelihood Gaussian EDAs Using Anticipated Mean Shift

Many Estimation-of-Distribution Algorithms use maximum-likelihood (ML) estimates. For discrete variables this has met with great success. For continuous variables the use of ML estimates for the normal distribution does not directly lead to successful optimization in most landscapes. It was previously found that an important reason for this is the premature shrinking of the variance at an expo...
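The premature variance shrinking mentioned in this abstract can be made visible with a tiny toy experiment (my own illustration under assumed settings, not taken from that paper): on a one-dimensional linear slope, truncation selection followed by maximum-likelihood re-estimation reduces the standard deviation by a roughly constant factor each generation, i.e. exponentially fast, even though no optimum is being approached.

    import numpy as np

    # Toy 1-D illustration (assumed settings): minimise f(x) = x, a pure slope
    # with no optimum.  ML re-estimation after truncation selection shrinks the
    # standard deviation by a near-constant factor per generation.
    rng = np.random.default_rng(1)
    pop_size, mu = 200, 50
    mean, std = 0.0, 1.0
    for gen in range(6):
        pop = rng.normal(mean, std, pop_size)
        selected = np.sort(pop)[:mu]          # keep the mu smallest values
        new_mean, new_std = selected.mean(), selected.std()
        print(f"gen {gen}: std = {new_std:.3f} (ratio {new_std / std:.2f})")
        mean, std = new_mean, new_std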

Loudness scaling revisited.

The present work was undertaken in an attempt to evaluate whether it is reasonable to expect that categorical loudness scaling can provide useful information for nonlinear hearing aid fitting. Normative data from seven scaling procedures show that the individual procedures relate the perceptual categories differently to sound level and with a substantial between-subject variance. Hearing-impair...

Statistical linear models in Procrustes shape space

The configuration matrix of a set of labeled landmarks is one of the most used shape representations. However, it is well-known that the configuration matrix is not invariant under translation, scaling and rotation. This problem is revisited in this work where a local tangent shape space characterization at a reference shape is obtained as the null space of the subspace spanned by the reference sh...

Matching inductive search bias and problem structure in continuous Estimation-of-Distribution Algorithms

Research into the dynamics of Genetic Algorithms (GAs) has led to the field of Estimation-of-Distribution Algorithms (EDAs). For discrete search spaces, EDAs have been developed that have obtained very promising results on a wide variety of problems. In this paper we investigate the conditions under which the adaptation of this technique to continuous search spaces fails to perform optimization...

Publication date: 2011